Multimodal Guitar: Performance Toolbox and Study Workbench
Abstract
This project studies how recent interaction technologies can extend how we play the guitar, thus defining the “multimodal guitar”. We investigate two axes: 1) “A gestural/polyphonic sensing/processing toolbox to augment guitar performances”, and 2) “An interactive guitar score following environment for adaptive learning”. These approaches share similar technological challenges (sensing, analysis, processing, synthesis and interaction methods) and dissemination intentions (community-based, low-cost, open-source whenever possible), while leading to different applications (artistic and educational, respectively), targeted at experienced players and beginners alike. We designed and developed a toolbox for multimodal guitar performances containing the following tools: Polyphonic Pitch Estimation (see section 3.1.1), Fretboard Grouping (see section 3.1.2), Rear-mounted Pressure Sensors (see section 3.2), Infinite Sustain (see section 3.3.2), Rearranging Looper (see section 3.3.3), and Smart Harmonizer (see section 3.3.4). The Modal Synthesis tool (see section 3.3.1) needs to be refined before being released. We designed a low-cost offline system for guitar score following (see section 4.1). An audio modality, polyphonic pitch estimation from a monophonic audio signal, is the main source of information (see section 4.2), while a visual input modality, finger and headstock tracking using computer vision techniques on two webcams, provides complementary information (see section 4.3). We built a stable data acquisition approach with low information loss (see section 4.5), and a probability-based fusion scheme that handles missing data and unexpected or misinterpreted results from the individual modalities, yielding better multi-pitch transcription results (see section 4.4).
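The abstract does not detail the fusion scheme itself (see section 4.4 for that); as a minimal sketch only, assuming each modality reports a probability per candidate pitch and that a missing modality is simply skipped, a weighted probabilistic fusion could look like the following (all names and the `audio_weight` parameter are illustrative, not the project's actual implementation):

```python
def fuse_modalities(audio_probs, vision_probs, audio_weight=0.7):
    """Combine per-pitch probabilities from the audio and visual
    modalities. A modality that produced no estimate (None) is
    treated as missing, and the remaining modality is used alone."""
    if audio_probs is None and vision_probs is None:
        return {}
    if vision_probs is None:
        return dict(audio_probs)
    if audio_probs is None:
        return dict(vision_probs)
    fused = {}
    for pitch in set(audio_probs) | set(vision_probs):
        a = audio_probs.get(pitch, 0.0)
        v = vision_probs.get(pitch, 0.0)
        fused[pitch] = audio_weight * a + (1.0 - audio_weight) * v
    return fused

# Example: audio hears E2 and B2; vision only confirms a fretted E2.
fused = fuse_modalities({"E2": 0.8, "B2": 0.6}, {"E2": 0.9})
```

Weighting the audio modality more heavily reflects the abstract's framing of audio as the main information source, with vision as complementary.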
We designed a visual output modality that simultaneously displays the guitar score and feedback from the score-following evaluation (see section 4.6). The audio modality and parts of the visual input modality already run in realtime; the multimodal fusion and visualization still need to be improved so that the whole system can run in realtime.
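The polyphonic pitch estimator is described in section 4.2, not in this abstract; purely for illustration, a toy harmonic-summation salience function, which scores each candidate fundamental by summing spectral magnitude at its first few harmonics, might be sketched as follows (function names and parameters are assumptions, not the project's actual estimator):

```python
import numpy as np

def harmonic_salience(signal, sr, f0_candidates, n_harmonics=5):
    """Toy polyphonic pitch salience: score each candidate f0 by
    summing the magnitude spectrum at its first few harmonics."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)
    scores = {}
    for f0 in f0_candidates:
        # Nearest FFT bin for each harmonic k * f0
        idx = [int(np.argmin(np.abs(freqs - k * f0)))
               for k in range(1, n_harmonics + 1)]
        scores[f0] = float(spectrum[idx].sum())
    return scores

# Two sine tones at 110 Hz (A2) and 196 Hz (G3)
sr = 44100
t = np.arange(4096) / sr
sig = np.sin(2 * np.pi * 110 * t) + np.sin(2 * np.pi * 196 * t)
scores = harmonic_salience(sig, sr, [110.0, 146.8, 196.0])
```

In this example the present pitches (110 Hz and 196 Hz) score well above the absent candidate (146.8 Hz). A real guitar transcription system would additionally need onset handling, octave disambiguation, and robustness to overlapping harmonics, which is where the multimodal fusion described above helps.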